Chapter 4. Nonparametric regression: minimax upper and lower bounds
Abstract
We consider one of the two most classical nonparametric problems in this chapter: estimating a regression function on a subset of the real line (the other being estimation of a density). In nonparametric regression, we assume there is an unknown function f : R → R belonging to a pre-determined class of functions F; usually this class is parameterized by some type of smoothness guarantee. To make the problem concrete, we will assume that the unknown function f is L-Lipschitz and defined on [0, 1], and we let F denote this class. (For a fuller technical introduction to nonparametric estimation, see the book by Tsybakov [2].)
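As a hedged illustration of the kind of estimator that attains the minimax upper bound for this class, the sketch below implements a simple regressogram in Python/NumPy: partition [0, 1] into roughly n^{1/3} bins and average the responses within each bin. This is the standard textbook construction, not code from the chapter itself; the bin width h trades the squared bias, of order L^2 h^2, against the variance, of order sigma^2/(n h), which yields the classical mean-squared-error rate of order (L sigma^2 / n)^{2/3}. The test function and all parameter values are illustrative assumptions.

    import numpy as np

    def regressogram(x, y, num_bins):
        # Piecewise-constant local-averaging estimator on [0, 1]:
        # average the responses y whose design points x share a bin.
        edges = np.linspace(0.0, 1.0, num_bins + 1)
        idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, num_bins - 1)
        sums = np.bincount(idx, weights=y, minlength=num_bins)
        counts = np.bincount(idx, minlength=num_bins)
        means = np.where(counts > 0, sums / np.maximum(counts, 1), 0.0)

        def fhat(t):
            j = np.clip(np.searchsorted(edges, t, side="right") - 1, 0, num_bins - 1)
            return means[j]

        return fhat

    # Illustrative use: roughly n^{1/3} bins balance bias against variance.
    rng = np.random.default_rng(0)
    n, sigma = 1000, 0.5
    x = rng.uniform(0.0, 1.0, size=n)
    f = lambda t: np.abs(t - 0.5)            # a 1-Lipschitz test function
    y = f(x) + sigma * rng.normal(size=n)
    fhat = regressogram(x, y, num_bins=int(np.ceil(n ** (1 / 3))))
    grid = np.linspace(0.0, 1.0, 200)
    print("empirical squared error:", np.mean((fhat(grid) - f(grid)) ** 2))

The choice of roughly n^{1/3} bins corresponds to the optimal bin width h of order (sigma^2 / (L^2 n))^{1/3} when L = 1; lower bounds of matching order are what make this rate minimax.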
Similar resources
Minimax Risk Bounds in Extreme Value Theory
Asymptotic minimax estimators of a positive extreme value index under zero-one loss are investigated in the classical i.i.d. setup. To this end, we prove the weak convergence of suitable local experiments, with Pareto distributions as the center of localization, to a white noise model, which was previously studied in the context of nonparametric local density estimation and regression. From this resu...
On rate optimality for ill-posed inverse problems in econometrics
In this paper, we clarify the relations between the existing sets of regularity conditions for convergence rates of nonparametric indirect regression (NPIR) and nonparametric instrumental variables (NPIV) regression models. We establish minimax risk lower bounds in mean integrated squared error loss for the NPIR and the NPIV models under two basic regularity conditions that allow for both mildl...
Extended Stochastic Complexity and Minimax Relative Loss Analysis
We are concerned with the problem of sequential prediction using a given hypothesis class of continuously many prediction strategies. An effective performance measure is the minimax relative cumulative loss (RCL), which is the minimum of the worst-case difference between the cumulative loss for any prediction algorithm and that for the best assignment in a given hypothesis class. The purpose of t...
Minimax Estimation of Maximum Mean Discrepancy with Radial Kernels
Maximum Mean Discrepancy (MMD) is a distance on the space of probability measures which has found numerous applications in machine learning and nonparametric testing. This distance is based on the notion of embedding probabilities in a reproducing kernel Hilbert space. In this paper, we present the first known lower bounds for the estimation of MMD based on finite samples. Our lower bounds hold... (A minimal sketch of the standard empirical MMD estimator follows this list.)
Local Privacy, Data Processing Inequalities, and Statistical Minimax Rates
Working under a model of privacy in which data remains private even from the statistician, we study the tradeoff between privacy guarantees and the utility of the resulting statistical estimators. We prove bounds on information-theoretic quantities, including mutual information and Kullback-Leibler divergence, that depend on the privacy guarantees. When combined with standard minimax techniques...
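Since the MMD entry above concerns estimating MMD from finite samples, here is a minimal, hedged sketch of the standard unbiased U-statistic estimator of squared MMD with a radial (Gaussian) kernel. This is the well-known textbook estimator, not code from that paper; the bandwidth and sample values are illustrative assumptions.

    import numpy as np

    def gaussian_kernel(a, b, bandwidth=1.0):
        # Radial (Gaussian) kernel matrix between the rows of a and b.
        d2 = np.sum(a**2, axis=1)[:, None] + np.sum(b**2, axis=1)[None, :] - 2.0 * a @ b.T
        return np.exp(-d2 / (2.0 * bandwidth**2))

    def mmd2_unbiased(x, y, bandwidth=1.0):
        # Unbiased U-statistic estimator of squared MMD between samples x and y.
        m, n = len(x), len(y)
        kxx = gaussian_kernel(x, x, bandwidth)
        kyy = gaussian_kernel(y, y, bandwidth)
        kxy = gaussian_kernel(x, y, bandwidth)
        # Drop diagonal terms so the within-sample averages are unbiased.
        return ((kxx.sum() - np.trace(kxx)) / (m * (m - 1))
                + (kyy.sum() - np.trace(kyy)) / (n * (n - 1))
                - 2.0 * kxy.mean())

    rng = np.random.default_rng(0)
    x = rng.normal(0.0, 1.0, size=(200, 1))   # sample from P
    y = rng.normal(0.5, 1.0, size=(200, 1))   # sample from Q
    print("estimated squared MMD:", mmd2_unbiased(x, y))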
Publication date: 2014